Stephanie Kobakian
18 July 2016
This project begins by testing how well currently available facial recognition software performs at identifying the faces of the two players in a tennis match.
This presents challenges, as our use on tennis broadcasts involves multiple, frequently changing camera angles. This is unlike the intended use of these software packages: a full frontal scan at an access point.
105 singles matches from the 2016 Australian Open, differing by:
We then took:
The following slides include stills and what was recognised when the software packages were applied.
The software that is chosen should be able to recognise this face even though it is not a front-on angle.
The software that is chosen should also be able to recognise this face.
We would hope that this angle would still allow for accurate recognition; however, it was recognised by only one software package.
It would be unexpected for a face of this size to be recognised, and it was recognised by only one software package. Note that the minimum pixel distance that allows for recognition is 36.
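The 36-pixel minimum mentioned above could be applied as a simple size filter before recognition is attempted. A minimal sketch in Python, assuming hypothetical detections represented as `(x, y, width, height)` bounding boxes in pixels:

```python
# Minimal sketch: filter face detections by a minimum pixel size.
# The 36-pixel threshold comes from the minimum recognisable face size
# noted above; the detection tuples themselves are hypothetical examples.

MIN_FACE_PIXELS = 36

def filter_small_faces(detections, min_size=MIN_FACE_PIXELS):
    """Keep only detections whose width and height meet the minimum size."""
    return [
        (x, y, w, h)
        for (x, y, w, h) in detections
        if w >= min_size and h >= min_size
    ]

detections = [
    (100, 50, 40, 48),   # large enough to be recognised
    (300, 200, 20, 24),  # below the 36-pixel minimum, discarded
]
kept = filter_small_faces(detections)
# kept contains only the first box
```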
Only Animetrics was able to recognise this face; these shaded angles should be a high priority for recognition.
As this is the intended future use, we would need the software to detect emotions and angles such as this.
This angle should also be detected by any software we would consider for this use.
This angle should also be detected by any software we would consider for this use.
It was surprising that none of the software packages was able to detect this face. Profile shots should also be distinguishable.
The glasses and facial features may have contributed to the lack of recognition; this will be investigated further.
The glasses, facial features, and face angle in this still frame may have contributed to the lack of recognition; this will also be investigated further.
Skybiometry was able to identify this ‘face’, created by the creases in the shirt.
Animetrics was able to identify two faces in this image, one of them a towel; the sensitivity will be investigated further.
Animetrics was able to correctly identify this face, but it did so twice. This presents an issue, as the algorithm did not recognise that both detections were the same face.
Animetrics also produced an interesting recognition of this ‘face’, despite it not being the intended face for recognition.
Looking at the crowd shots gives insight into how the programs contrast in their abilities. Skybiometry produced an interesting detection on a fist, and Animetrics found only one face.
This showed that the software packages struggled during a transition.
In the previous image, only one software package found a face; in this image, all three packages were able to locate both faces. This contrast gives insight into their capabilities.
Some shots result in difficulties when applying recognition software.
This was difficult for recognition software intended for clear, well-lit faces.

### Birdseye View

This was difficult because the range of angles meant that faces were too small to be distinguished.
It was hoped that these situations could be used for recognition, as it is at these moments that emotions could affect gameplay.
This view is common in broadcasts but completely impractical for recognition: there are no recognisable faces in this image.
It was expected that recognition would be hindered when faces are covered by objects used by players courtside. However, it was unexpected that a detection would be found on the towel, and that the face in the crowd would be recognised.
Our gold standard, to compare against the selected software packages
The information collected manually and by the three software packages differed in format. We collated it so that comparable information was captured in a single data frame for analysis. This allowed us to create the mapped face boxes seen in the previous images.
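The collation step above could be sketched as follows. This is a minimal illustration, not the project's actual pipeline: the column names, source labels, and example box coordinates are all hypothetical, and pandas stands in for whatever data-frame tooling was used.

```python
# Minimal sketch: collate face boxes from the gold standard and the
# software packages into one tidy data frame, one row per detection.
# All records and column names below are hypothetical illustrations.
import pandas as pd

gold = [{"frame": 1, "x": 120, "y": 60, "w": 44, "h": 50}]
skybiometry = [{"frame": 1, "x": 118, "y": 62, "w": 42, "h": 49}]
animetrics = [{"frame": 1, "x": 121, "y": 59, "w": 45, "h": 51}]

def tidy(records, source):
    """Convert one source's detections to a data frame tagged with its name."""
    df = pd.DataFrame(records)
    df["source"] = source
    return df

faces = pd.concat(
    [tidy(gold, "gold"),
     tidy(skybiometry, "skybiometry"),
     tidy(animetrics, "animetrics")],
    ignore_index=True,
)
# `faces` now holds comparable (frame, x, y, w, h, source) rows,
# ready for drawing the mapped face boxes over each still.
```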
Two of the software packages distinguished a face in the creases of the shirt.
The different colours indicate the faces found by each of the three software packages. Only one package considered the Adidas shirt a face, and glasses tended to result in only two packages recognising the face.
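The comparison above, counting how many packages found each face, can be computed directly from the collated detections. A minimal sketch, where the face labels and the third package's name are hypothetical placeholders:

```python
# Minimal sketch: count how many distinct software packages detected
# each face. Face labels and the "package_3" name are hypothetical.
import pandas as pd

detections = pd.DataFrame({
    "face": ["player_a", "player_a", "player_a",
             "player_b_glasses", "player_b_glasses",
             "adidas_shirt"],
    "software": ["skybiometry", "animetrics", "package_3",
                 "skybiometry", "animetrics",
                 "skybiometry"],
})

counts = detections.groupby("face")["software"].nunique()
# The shirt 'face' is found by one package, the glasses face by two,
# and the clear face by all three - matching the colour-coded plot.
```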